
    Peer-reviewed open research data: results of a pilot

    Peer review of publications is at the core of science and is primarily seen as an instrument for ensuring research quality. However, it is less common to independently assess the quality of the underlying data as well. In light of the “data deluge” it makes sense to extend peer review to the data itself and in this way evaluate the degree to which the data are fit for re-use. This paper describes a pilot study at EASY, the electronic archive for (open) research data at our institution. In EASY, researchers can archive their data and add metadata themselves. Devoted to open access and data sharing, the archive is interested in further enriching these metadata with peer reviews. As a pilot we established a workflow in which researchers who had downloaded data sets from the archive were asked to review the downloaded data set. This paper describes the details of the pilot, including both the quantitative and the qualitative findings. Finally, we discuss issues that need to be solved if such a pilot is to be turned into structural peer review functionality of the archiving system.

    The CODATA-RDA Data Steward School

    Given the expected increase in demand for Data Stewards and Data Stewardship skills, it is clear that there is a need to develop training, education and continuous professional development (CPD) in this area. This paper provides a brief introduction to the origin of definitions of Data Stewardship and notes the present tendency towards equivalence between Data Stewardship skills and the FAIR principles. It then focuses on one specific training event: the pilot Data Stewardship strand of the CODATA-RDA Research Data Science schools, which by the time of the IDCC meeting will have been held in Trieste in August 2019. The paper discusses the overall curriculum of the pilot school, how it matches the FAIR4S framework, and plans for getting feedback from the students. Finally, the paper discusses future plans for the school, in particular how to deepen the integration between the Data Stewardship strand and the Early Career Researcher strand. [This paper is a conference pre-print presented at IDCC 2020 after lightweight peer review.]

    Experiences with reviewing data management plans - an LCRDM survey

    This dataset contains the results of a survey on the reviewing of data management plans (DMPs). The survey was carried out by the Onderzoeksondersteuning en Advies (Research Support and Advice) working group of the Landelijk Coördinatiepunt Research Data Management (LCRDM). Sixty respondents shared their experiences with and feedback on DMPs through this survey. The dataset contains a concise report, the survey questions and the anonymised data.

    FAIR data work @ DANS - iPRES 2019 Amsterdam

    Ever since the origin of the FAIR data guiding principles, various members of the DANS staff have been involved in a variety of activities, both thinking about the implications of the principles and implementing them. This paper presents an overview of the fruits of our work so far and sketches our ideas for the years to come. We were involved as co-authors of the original publication on the FAIR principles; developed and tested FAIR metrics; worked on tools to rate the FAIRness of datasets and on a FAIR checklist for researchers; evaluated how our own data archives score on FAIRness; compared the principles to the requirements of the Data Seal of Approval and the CoreTrustSeal; explored the applicability of the FAIR principles to software sustainability; prepared guidelines for FAIR data management; and are leading the prominent Horizon 2020 FAIRsFAIR project in the context of the European Open Science Cloud.

    Capability Maturity & Community Engagement Design Statement

    This paper proposes a design approach to evaluating capability in particular areas of focus within an entity, and the level of adoption, practice and collaboration displayed in relation to policy and standardisation. Understanding our levels of capability in particular areas, and how these contribute to overall maturity, allows us to understand our current status and to identify and invest in areas where we want to improve and monitor our progress. This applies to both self-assessment and external evaluation. The approach can be adjusted to address a range of entities, including organisations, parts of organisations and component services. It should be possible to apply it to repositories and other organisations providing research data infrastructure (RDI), and also to research funding organisations (RFO) and research performing organisations (RPO). In addition to existing capability and maturity models, there are several under development or review within the EOSC and trustworthy digital repository (TDR) space. The model designers need the freedom to develop and evolve independently, but we recognise the risk to implementers of having a confusing range of non-interoperable and resource-intensive systems to choose from. This text reflects the evolving design statement being developed within the FAIRsFAIR project.